Search results for: Rényi entropy

Number of results: 70986

The Rényi entropy generalizes Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a generalization of Shannon entropy, but its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Likewise, for Tsallis entropy, the conditional entropy was introduced a...
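As a concrete illustration of the three families mentioned above (a sketch of the standard textbook formulas, not code from any of the listed papers; all values in nats):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """H_alpha(p) = log(sum p_i^alpha) / (1 - alpha); recovers Shannon as alpha -> 1."""
    if alpha == 1:
        return shannon_entropy(p)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def tsallis_entropy(p, q):
    """S_q(p) = (1 - sum p_i^q) / (q - 1): a non-logarithmic generalization."""
    if q == 1:
        return shannon_entropy(p)
    return (1 - sum(pi ** q for pi in p if pi > 0)) / (q - 1)

uniform = [0.25, 0.25, 0.25, 0.25]
# On the uniform distribution every Rényi order gives log(4) ~ 1.386 nats,
# while Tsallis entropy with q = 2 gives (1 - 4 * 0.25**2) / 1 = 0.75.
```

Both families reduce to Shannon entropy in the limit of their parameter going to 1, which is why each is called a one-parameter generalization.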

Journal: :CoRR 2012
Marek Smieja Jacek Tabor

Rényi entropy of order α is a general measure of entropy. In this paper we derive estimates of the Rényi entropy of a mixture of sources in terms of the entropies of the individual sources. These relations allow one to compute the Rényi entropy dimension of arbitrary order of a mixture of measures. The key to obtaining these results is our new definition of the weighted Rényi entropy. It is shown t...

2008
S. Baratpour J. Ahmadi N. R. Arghami

Two different distributions may have equal Rényi entropy; thus a distribution cannot be identified by its Rényi entropy. In this paper, we explore properties of the Rényi entropy of order statistics. Several characterizations are established based on the Rényi entropy of order statistics and record values. These include characterizations of a distribution on the basis of the differences between...

Journal: :Quantum Information Processing 2016
Frédéric Dupuis Mark M. Wilde

This paper introduces “swiveled Rényi entropies” as an alternative to the Rényi entropic quantities put forward in [Berta et al., Physical Review A 91, 022333 (2015)]. What distinguishes the swiveled Rényi entropies from the prior proposal of Berta et al. is that there is an extra degree of freedom: an optimization over unitary rotations with respect to particular fixed bases (swivels). A conse...

2010
Tim van Erven Peter Harremoës

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
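The relationship described above can be made concrete (a sketch using the standard definition, not code from the survey itself): for discrete P and Q, Rényi divergence of order α is D_α(P‖Q) = (1/(α−1)) log Σ p_i^α q_i^{1−α}, and it recovers the Kullback-Leibler divergence as α → 1:

```python
import math

def kl_divergence(p, q):
    """D(P||Q) = sum p_i log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = log(sum p_i^alpha q_i^(1-alpha)) / (alpha - 1)."""
    if alpha == 1:
        return kl_divergence(p, q)
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# D_alpha(P||P) = 0 for every order, and for alpha = 2 the divergence
# reduces to log(sum p_i^2 / q_i), a chi-square-like quantity.
```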

2015
Sergio Verdú

Rényi entropy and Rényi divergence have a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such a generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
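One of the proposals named above, Sibson's α-mutual information, can be sketched for a discrete input distribution and channel as I_α(X;Y) = (α/(α−1)) log Σ_y (Σ_x P(x) P(y|x)^α)^{1/α} (a sketch under one common convention; the function name and the matrix layout are our own, and conventions vary across the literature):

```python
import math

def sibson_mi(p_x, p_y_given_x, alpha):
    """Sibson's alpha-mutual information, in nats, for a discrete input p_x
    and channel p_y_given_x[x][y] = P(Y = y | X = x); requires alpha != 1."""
    n_y = len(p_y_given_x[0])
    s = 0.0
    for y in range(n_y):
        inner = sum(px * p_y_given_x[x][y] ** alpha
                    for x, px in enumerate(p_x) if px > 0)
        s += inner ** (1 / alpha)
    return (alpha / (alpha - 1)) * math.log(s)

# If the channel output ignores the input (X and Y independent),
# every order alpha gives zero, as a mutual information should.
```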

Journal: :Journal of High Energy Physics 2013

Journal: :CoRR 2005
Ambedkar Dukkipati M. Narasimha Murty Shalabh Bhatnagar

By replacing the linear averaging in Shannon entropy with the Kolmogorov-Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi, one can obtain only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
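The recipe described above can be verified numerically: taking the quasilinear (Kolmogorov-Nagumo) mean of the surprisals −log p_i with φ(x) = exp((1−α)x) reproduces the Rényi entropy of order α (a sketch; the function names here are ours, not the paper's):

```python
import math

def kn_mean(values, weights, phi, phi_inv):
    """Quasilinear (Kolmogorov-Nagumo) mean: phi_inv(sum w_i * phi(v_i))."""
    return phi_inv(sum(w * phi(v) for v, w in zip(values, weights)))

def renyi_via_kn(p, alpha):
    """Rényi entropy as the KN-mean of surprisals under phi(x) = exp((1-alpha)x)."""
    phi = lambda x: math.exp((1 - alpha) * x)
    phi_inv = lambda y: math.log(y) / (1 - alpha)
    support = [pi for pi in p if pi > 0]
    surprisals = [-math.log(pi) for pi in support]
    return kn_mean(surprisals, support, phi, phi_inv)

def renyi_direct(p, alpha):
    """Direct formula H_alpha(p) = log(sum p_i^alpha) / (1 - alpha)."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)
```

Unwinding the algebra: Σ p_i exp((1−α)(−log p_i)) = Σ p_i · p_i^{α−1} = Σ p_i^α, and applying φ^{-1} gives log(Σ p_i^α)/(1−α), the Rényi entropy; choosing φ linear instead gives Shannon entropy, which is why only these two measures come out of the recipe.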

2015
Wu - Zhong Guo Song He

We study the Rényi entropy of locally excited states, considering thermal and boundary effects in turn, in two-dimensional conformal field theories (CFTs). First, we consider locally excited states obtained by acting with primary operators on a thermal state in the low-temperature limit. The Rényi entropy is the sum of contributions from the thermal effect and the local excitation. Second, we mainl...

2017
Igal Sason Sergio Verdú

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano’s...
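The conditional quantity used in this abstract can be sketched for a discrete joint distribution: Arimoto's version is H_α(X|Y) = (α/(1−α)) log Σ_y (Σ_x P(x,y)^α)^{1/α}, which recovers the Shannon conditional entropy as α → 1 (a sketch under Arimoto's convention; the code is ours, not the paper's):

```python
import math

def arimoto_conditional_entropy(p_joint, alpha):
    """Arimoto-Rényi conditional entropy H_alpha(X|Y), in nats.
    p_joint[y][x] = P(X = x, Y = y)."""
    if alpha == 1:
        # Shannon conditional entropy: -sum P(x,y) log P(x|y)
        return -sum(pxy * math.log(pxy / sum(row))
                    for row in p_joint for pxy in row if pxy > 0)
    s = sum(sum(pxy ** alpha for pxy in row if pxy > 0) ** (1 / alpha)
            for row in p_joint)
    return (alpha / (1 - alpha)) * math.log(s)

# H_alpha(X|Y) is non-increasing in alpha; e.g. for the joint distribution
# [[0.4, 0.1], [0.1, 0.4]] the order-2 value is below the Shannon value.
```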
